Money Mule Detection

Mules don't get caught one transaction at a time. They get caught in the ring.

Continuous. Connected. Explainable.

Soft-kernel scoring instead of cliff-edge rules. A temporal graph instead of isolated alerts. Counterfactual evidence instead of opaque scores — STR-ready, by default.

AMBIENT // OBSERVING · SARASW · ER ENGINE · "The Illusion" · 3 safe banks
Mule ring identified in 14 minutes · coordinated ring of 14 accounts, 3 banks, 1 herder
Herder H-0817 · Synchrony ±82 ms · Exposure $2.4M · SC-19 composite 0.97
Today's Reality

A thousand rules. And the same three problems.

Every threshold is a cliff. Mules learn it. Honest customers fall off it. Investigators stitch context across systems by hand. By the time the ring is mapped, the funds are layered three accounts deep — and the regulator wants to know why the rule that was tuned last quarter missed it.

Rules-based TM
KYC / Customer 360
Device & Session
Watchlist Engine
Adverse Media
Cross-bank Network
"We tune the rules. They learn the rules. We tune again." — Head of AML Analytics, Tier-1 Bank
With ContexQ

Six topologies. One temporal graph. Continuous scoring.

Every disruption pattern in mule fraud — pass-through flows, threshold evasion, ring topologies, device clusters, identity collisions, watchlist contagion — runs as a soft-kernel scorecard on the same temporal graph. No cliffs to tune around. The composite stays calibrated as mules adapt.

All sensors live · 6 typology classes monitored live
ContexQ composite SC-19 · 14:32:08.014
Ring M-2841 · 1 herder + 11 mules · $4.2M exposure
Score 0.94 · Latency 14 ms · STR ready

Catching the mule isn't the win. Catching the ring is.

Most rules engines flag a single suspicious payment, queue it for review, and lose the ring around it. By the time a human stitches the network, the funds are layered three accounts deep and the herder has spun up a new wave.

Rules-Engine Reaction · Days to Weeks

By the time the ring is mapped, the funds are gone.

Day 0 · Rule fires
Day 2 · Triage queue
Day 5 · Manual stitching
Day 8 · Funds layered
Day 14 · STR filed late
Outcome: money laundered, ring grows, regulator comments.
High false positives, late STR, no map of the wider ring — and the rule that missed it gets re-tuned next quarter.
ContexQ · Milliseconds to Hours

The ring surfaces before the trigger. The evidence ships during it.

T −2h · Cluster forming · SC-09 synchrony: 12 accounts pulse together, shared device · Watch
T 0 · Trigger payment · $58k cross-border, pass-through pattern · Alert
+14 ms · Composite 0.94 ≥ 0.9 · ring M-2841 resolved, herder identified · Resolved
+15 min · Block approved · STR draft generated, evidence attached · Done
+1 hr · Case closed
Funds preserved. Ring mapped. STR auto-drafted with kernel-level evidence.
Composite score traces back to every contributing kernel — investigators read the why, regulators read the audit.

The hard part isn't the AI. It's the soft kernel underneath it.

Real mule detection lives in the math between rules: continuous membership instead of binary cliffs, peer cohorts learned nightly, scorecards ensembling into a composite that traces back to evidence. A model dropped on raw transactions answers fast. It also misses the ring.

01
Streaming Data Fabric
INPUT LAYER
Payments / Kafka · Telemetry · Graph / Neo4j · Watchlists · CQ
Real-time payments, session telemetry, watchlists and consortium feeds — streaming through Kafka into a temporal graph and a per-cohort feature store. Half-lives, peer means and σ refreshed nightly.
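The nightly refresh can be pictured as two small primitives: an exponentially decayed (half-life) feature and a peer-cohort z-score. A minimal sketch with invented numbers; `half_life_feature` and `peer_zscore` are hypothetical names, not the fabric's API.

```python
import math

def half_life_feature(events, now_h, half_life_h):
    """Decayed sum of amounts: each event loses half its weight
    every `half_life_h` hours of age."""
    lam = math.log(2) / half_life_h
    return sum(amt * math.exp(-lam * (now_h - t)) for t, amt in events)

def peer_zscore(value, peer_mean, peer_sigma):
    """Standardise a customer's feature against its nightly peer cohort."""
    return (value - peer_mean) / peer_sigma if peer_sigma > 0 else 0.0

# Hypothetical customer: three inflows over the last day as (hour, amount).
events = [(0.0, 500.0), (12.0, 800.0), (23.0, 1200.0)]
velocity = half_life_feature(events, now_h=24.0, half_life_h=12.0)
z = peer_zscore(velocity, peer_mean=300.0, peer_sigma=150.0)
```

With a 12-hour half-life the day-old inflow contributes a quarter of its face value, so recent bursts dominate the feature without any hard lookback window.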
02
Entity Resolution
THE KEY UNLOCK
John Smith (4421) + J. Smyth (8902) + Jonathan A. S (1547) → single customer · 1 person, 3 accounts · shared device
Three "distinct" customers resolved into one operator behind three accounts. Across spelling, phone, biometric, device fingerprint. The signature mule onboarding pattern.
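A toy version of that resolution step: match on a shared hard identifier (device, phone) or a fuzzy name score, then merge with union-find. The profiles mirror the three accounts above; the matching rule itself is illustrative, not the production logic.

```python
from difflib import SequenceMatcher

def same_person(a, b, name_threshold=0.8):
    """Heuristic link: shared hard identifier OR high fuzzy-name similarity."""
    if a["device"] == b["device"] or a["phone"] == b["phone"]:
        return True
    return SequenceMatcher(None, a["name"].lower(),
                           b["name"].lower()).ratio() >= name_threshold

def resolve(profiles):
    """Union-find: group account profiles into single-customer clusters."""
    parent = list(range(len(profiles)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i
    for i in range(len(profiles)):
        for j in range(i + 1, len(profiles)):
            if same_person(profiles[i], profiles[j]):
                parent[find(i)] = find(j)
    clusters = {}
    for i, p in enumerate(profiles):
        clusters.setdefault(find(i), []).append(p["acct"])
    return list(clusters.values())

profiles = [
    {"acct": "4421", "name": "John Smith",    "device": "D1", "phone": "P1"},
    {"acct": "8902", "name": "J. Smyth",      "device": "D1", "phone": "P2"},
    {"acct": "1547", "name": "Jonathan A. S", "device": "D2", "phone": "P1"},
]
clusters = resolve(profiles)
```

The shared device links 4421 to 8902 and the shared phone links 4421 to 1547, so all three collapse into one cluster even though no single identifier spans all three profiles.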
03
Temporal Mule Graph
CONTEXT LAYER
Customer → Account → Device → Beneficiary → Cash-out · herder hub · 2-hop traversal
Resolved customer plus their network — accounts, devices, beneficiaries — surfacing the herder hub two hops out by graph centrality and community taint.
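Fan-in on the transfer graph is the simplest hub signal. A sketch over a plain edge list; the herder ID and ring shape are invented for illustration, and production scoring would combine this with PageRank-style centrality and community taint.

```python
from collections import defaultdict

def fan_in(edges):
    """Count distinct senders per receiver in a directed transfer graph."""
    senders = defaultdict(set)
    for src, dst in edges:
        senders[dst].add(src)
    return {node: len(s) for node, s in senders.items()}

def herder_candidates(edges, min_fan_in=5):
    """Nodes whose fan-in exceeds a threshold: hub-style collectors."""
    counts = fan_in(edges)
    return sorted((n for n, c in counts.items() if c >= min_fan_in),
                  key=lambda n: -counts[n])

# Hypothetical ring: six mules forwarding into one collector.
edges = [(f"M-{i}", "H-0817") for i in range(6)] + [("M-0", "M-1")]
hubs = herder_candidates(edges)
```

Only the collector crosses the fan-in threshold; the mule-to-mule hop does not, which is why the hub pops out even in a noisy graph.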
04
Composite Decisioning
DECISION LAYER
SC-19 · ranked actions
A · Freeze ring (12 accts) · risk −94% · conf 96%
B · Step-up auth & monitor · risk −45% · conf 78%
C · Soft-flag & watch · risk −10% · conf 60%
Twenty soft-kernel scorecards ensemble into a single composite (SC-19) with counterfactual evidence (SC-20). Audit trail from outcome back to every contributing kernel.
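The ensemble step can be illustrated as a weighted average plus a per-kernel contribution trace, which is what makes the outcome auditable back to evidence. Kernel IDs match the page; the weights and scores here are invented.

```python
def composite(kernel_scores, weights):
    """Weighted ensemble of soft-kernel scores, plus a contribution
    trace (kernel, share-of-composite) sorted by influence."""
    total_w = sum(weights.values())
    contribs = {k: weights[k] * s / total_w for k, s in kernel_scores.items()}
    score = sum(contribs.values())
    trace = sorted(contribs.items(), key=lambda kv: -kv[1])
    return score, trace

scores  = {"SC-03": 0.97, "SC-11": 0.88, "SC-16": 0.94}
weights = {"SC-03": 1.0,  "SC-11": 1.0,  "SC-16": 2.0}
score, trace = composite(scores, weights)
```

The trace is the audit trail in miniature: an investigator reads that the herder-hub kernel contributed most, a regulator reads the exact arithmetic from outcome back to inputs.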
A rules engine
for mules
Draws a line. Mules walk around it. You re-tune. They re-learn. Every quarter. The cliff-edge isn't a bug of rules engines — it is the rules engine.
Soft Kernels
Continuous membership functions, not binary cliffs. The 11-minute mule scores almost the same as the 10-minute one — no edge to operate around.
Entity Resolution
One person resolved across N accounts, N devices, N profiles. Across spelling, language, biometric and behavioral fingerprint.
Ring Detection
Cycles, herder centrality, community taint — surfaced from the temporal graph in milliseconds, not in next-day batch.
STR-Ready Evidence
Counterfactual explanations and contributing-feature traces on every score. Investigators read the why; regulators read the audit.
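One way to produce a counterfactual is to search for the smallest change that would have flipped the decision. A sketch with a toy single-kernel score, assuming the score rises monotonically with the transaction amount; the $58k figure echoes the trigger payment above, everything else is illustrative.

```python
import math

def counterfactual_amount(score_fn, amount, threshold=0.9, lo=0.0, tol=1.0):
    """Bisect for the smallest reduction in `amount` that drops the score
    below `threshold` -- the 'what would have had to differ' line in an
    STR narrative. Assumes score_fn is monotone increasing in amount."""
    if score_fn(amount) < threshold:
        return None  # already below threshold; nothing to explain
    hi = amount
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if score_fn(mid) >= threshold:
            hi = mid
        else:
            lo = mid
    return amount - hi

# Toy kernel: logistic in the amount, centred at $40k.
score = lambda amt: 1 / (1 + math.exp(-(amt - 40_000) / 5_000))
delta = counterfactual_amount(score, 58_000)
```

The returned delta reads directly into evidence: "had the transfer been about $7k smaller, the composite would not have crossed 0.9."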

We catch the bot by the rhythm, not the password.

Twenty soft-kernel scorecards, ensembling into a composite. Three of them tell the story: the rhythm of behaviour, the no-cliff curve, and the fan-in that exposes the herder.

// SC-11 — THE RHYTHM

Behavioural Biometrics

// VERIFIED HUMAN σ = 0.42 // CRIMINAL SCRIPT σ = 0.03
Catch the bot by the rhythm, not the password.
Cadence · Drift · Pressure
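The σ values above are illustrative, but the underlying signal is just the spread of inter-event intervals: humans jitter, replay scripts are metronomic. A minimal sketch with invented timestamps.

```python
import statistics

def cadence_sigma(keystroke_times_ms):
    """Std-dev of inter-keystroke gaps. Low sigma = scripted cadence."""
    gaps = [b - a for a, b in zip(keystroke_times_ms, keystroke_times_ms[1:])]
    return statistics.pstdev(gaps)

human  = [0, 180, 420, 530, 790, 940]   # irregular, human-like timing
script = [0, 100, 200, 300, 400, 500]   # perfectly even replay

human_sigma  = cadence_sigma(human)
script_sigma = cadence_sigma(script)
```

A real kernel would normalise against the customer's own history and add drift and pressure channels, but the separation already shows up in this one number.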
// SC-03 — NO CLIFF

Soft-Kernel Scoring

$8K · $10K · $12K scale · hard-rule cliff at $10K · $9,999 ≈ $10,001
No cliff. Criminals can't hide one rupee under.
Continuous · Bayesian · Peer-cohort
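A logistic membership function is one way to get that no-cliff behaviour; the $10K centre and width here are illustrative, not the production parameters.

```python
import math

def soft_membership(amount, center=10_000.0, width=1_500.0):
    """Continuous membership in 'large transfer': smooth S-curve, no edge."""
    return 1 / (1 + math.exp(-(amount - center) / width))

def hard_rule(amount, threshold=10_000.0):
    """The cliff: one unit under scores 0, one unit over scores 1."""
    return 1.0 if amount > threshold else 0.0

gap_soft = abs(soft_membership(10_001) - soft_membership(9_999))
gap_hard = hard_rule(10_001) - hard_rule(9_999)
```

Under the hard rule, structuring one unit below the threshold erases the score entirely; under the soft kernel the two amounts score almost identically, so there is no edge worth operating around.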
// SC-16 — THE HUB

Mule-Herder Hub

Herder H-0817 · fan-in 50 · centrality 0.94
Find the controller, not the account.
Fan-In · PageRank · Betweenness